# English Continuous Pretraining
## Llama 3 6B V0.1
Described as the first 6-billion-parameter Llama-3 base model, created by downcycling Meta-Llama-3-8B and then continuously pretrained on 1 billion tokens of English text.
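Downcycling is only named in passing above. As a rough illustration of the idea (initializing a smaller model from a larger pretrained one by keeping a subset of its transformer layers, then continuing pretraining), here is a minimal, hypothetical Python sketch; the layer counts and selection strategy are assumptions for illustration, not the model author's actual recipe:

```python
def downcycle(layers, target_num_layers):
    """Seed a smaller model by selecting an evenly spaced subset of layers.

    `layers` stands in for a list of pretrained transformer blocks; in a
    real pipeline these would be weight tensors copied into a smaller
    model, which is then continuously pretrained on new tokens.
    """
    if target_num_layers >= len(layers):
        return list(layers)
    step = (len(layers) - 1) / (target_num_layers - 1)
    indices = [round(i * step) for i in range(target_num_layers)]
    return [layers[i] for i in indices]

# Toy example: a 32-layer "8B-style" stack reduced to a 24-layer
# "6B-style" stack (counts are illustrative assumptions).
source_layers = [f"layer_{i}" for i in range(32)]
smaller_stack = downcycle(source_layers, 24)
print(len(smaller_stack))  # 24
```

Evenly spaced selection keeps both the first and last pretrained blocks, which is one common heuristic for preserving the input and output representations of the source model.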
Tags: Large Language Model · Transformers · English
Author: prince-canuma
© 2025 AIbase